Howard discusses Sex, Race, and Robotics and how to fight bias in AI
Ayanna Howard with a Dynamic Anthropomorphic Robot with Intelligence-Open Platform (DARwIn-OP). Source: Rob Felt, Georgia Institute of Technology. Headlines regularly proclaim that robots are coming for people's jobs or are "creepy," but both robotics developers and the general public are increasingly aware of the many ways in which the technology can boost productivity and safety. However, the need to understand how robots and artificial intelligence can inherit negative human biases is still urgent, according to roboticist Ayanna Howard. "Bias in AI is the responsibility of the designer," said Howard, who recently published the book Sex, Race, and Robots: How to Be Human in the Age of AI.
Startup launches world's first genderless AI to fight bias in smart assistants
Talk to Apple's Siri or Amazon's Alexa and you'll notice a common trait: they both have female voices. While this can make robotic assistants more relatable and natural to converse with, it assigns a gender to a technology that is otherwise genderless. Now, researchers are hoping to offer a new alternative by launching what they're calling the world's first "genderless voice." To create "Q," researchers recorded voices from participants who identify as non-binary, or neither exclusively female nor male. Researchers then tested the voice on 4,600 people across Europe.
Justice Can't Be Colorblind: How to Fight Bias with Predictive Policing
Originally published by Scientific American. Law enforcement's use of predictive analytics recently came under fire again. Dartmouth researchers made waves by reporting that simple predictive models, as well as nonexpert humans, predict crime just as well as the leading proprietary analytics software. That the leading software achieves only human-level performance might not actually be a deadly blow, but a flurry of press from dozens of news outlets quickly followed. In any case, even as this disclosure raises questions about one software tool's credibility, a more enduring, inherent quandary continues to plague predictive policing.